Cross-Situational Learning with Bayesian Generative Models for Multimodal Category and Word Learning in Robots
Authors
Abstract
In this paper, we propose a Bayesian generative model that can form multiple categories based on each sensory channel and can associate words with any of four sensory channels (action, position, object, and color). This paper focuses on cross-situational learning that exploits the co-occurrence between words and sensory-channel information in complex situations, rather than the conventional settings of cross-situational learning. We conducted a learning scenario using a simulator and a real humanoid iCub robot. In the scenario, a human tutor provided the robot with a sentence describing an object of visual attention and an accompanying action. The scenario was set up as follows: the number of words per sensory channel was three or four, and the number of learning trials was 20 and 40 for the simulator and 25 and 40 for the real robot. The experimental results showed that the proposed method estimated the multiple categorizations and learned the relationships between the sensory channels and words accurately. In addition, we conducted an action-generation task and an action-description task based on the word meanings learned in the cross-situational learning scenario. The results showed that the robot could successfully use the word meanings learned with the proposed method.
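The core mechanism sketched in the abstract, attaching each word to the sensory channel whose categories it reliably co-occurs with, can be made concrete with a small toy example. The Python below is a minimal sketch, not the paper's implementation: it assumes the per-channel category assignments for each trial are already given (the paper infers them with the Bayesian generative model), and the channel names, function names, and toy data are all illustrative.

```python
# Minimal sketch (assumed, not the authors' code): cross-situational
# association of words with per-channel categories via co-occurrence counts.
from collections import defaultdict

CHANNELS = ["action", "position", "object", "color"]

# co_occur[channel][(category, word)] -> co-occurrence count
co_occur = {ch: defaultdict(int) for ch in CHANNELS}

def observe(trial_categories, words):
    """Accumulate word/category co-occurrences for one tutoring trial."""
    for ch in CHANNELS:
        cat = trial_categories[ch]
        for w in words:
            co_occur[ch][(cat, w)] += 1

def word_channel_strength(word):
    """Score how strongly `word` attaches to each channel: the peak
    co-occurrence of the word with any single category on that channel,
    normalized over channels (a crude stand-in for the model's
    probabilistic word-channel association)."""
    peaks = {ch: max((n for (cat, w), n in co_occur[ch].items() if w == word),
                     default=0)
             for ch in CHANNELS}
    total = sum(peaks.values()) or 1
    return {ch: peaks[ch] / total for ch in CHANNELS}

# Toy trials: "grasp" always accompanies action category 0, while the
# other channels vary, so "grasp" should attach to the action channel.
observe({"action": 0, "position": 1, "object": 2, "color": 0}, ["grasp", "red"])
observe({"action": 0, "position": 0, "object": 1, "color": 2}, ["grasp", "blue"])
observe({"action": 0, "position": 2, "object": 0, "color": 1}, ["grasp", "green"])
print(word_channel_strength("grasp"))  # highest score on "action"
```

Because the action category is stable across trials while the other channels vary, the co-occurrence peak for "grasp" concentrates on the action channel; this is the cross-situational effect the full model captures probabilistically.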
Related resources
Linking Learning to Looking: Habituation and Association in Infant Statistical Language Learning
Recent experiments have shown the importance of statistical learning in infant language acquisition. Computational models of such learning, however, often take the form of corpus analyses and are thus difficult to connect to empirical data. We report a cross-situational learning experiment which demonstrates robust individual differences in learning between infants. We then present a novel gene...
Integrating Syntactic Knowledge into a Model of Cross-situational Word Learning
It has been suggested that children learn the meanings of words by observing the regularities across different situations in which a word is used. However, experimental studies show that children are also sensitive to the syntactic properties of words and their context at a young age, and can use this information to find the correct referent for novel words. We present a unified computational m...
Pragmatically Framed Cross-Situational Noun Learning Using Computational Reinforcement Models
Cross-situational learning and social pragmatic theories are prominent mechanisms for learning word meanings (i.e., word-object pairs). In this paper, the role of reinforcement in early word learning by an artificial agent is investigated. When exposed to a group of speakers, the agent comes to understand an initial set of vocabulary items belonging to the language used by the group. Both cros...
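The reinforcement framing in this snippet can be illustrated with a short sketch. The paper's actual models are not specified above, so everything here is an assumption: the learner guesses a referent for a heard word and strengthens the word-object association only when the guess is rewarded.

```python
# Hedged toy sketch of reinforcement-based word-referent learning
# (illustrative only; not the models from the paper above).
import random

random.seed(0)
ALPHA = 0.5    # learning rate (assumed)
EPSILON = 0.1  # exploration rate (assumed)
value = {}     # (word, object) -> association strength

def choose(word, objects):
    """Epsilon-greedy guess of the referent for `word`."""
    if random.random() < EPSILON:
        return random.choice(objects)
    return max(objects, key=lambda o: value.get((word, o), 0.0))

def update(word, obj, reward):
    """Move the association strength toward the received reward."""
    v = value.get((word, obj), 0.0)
    value[(word, obj)] = v + ALPHA * (reward - v)

# Toy episodes: the speaker rewards correct guesses of "ball".
for _ in range(30):
    guess = choose("ball", ["ball", "cup", "dog"])
    update("ball", guess, 1.0 if guess == "ball" else 0.0)

print(max(["ball", "cup", "dog"],
          key=lambda o: value.get(("ball", o), 0.0)))  # -> ball
```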
Supplementary material for "Using speakers' referential intentions to model early cross-situational word learning"
This supplementary document accompanies the Bayesian model of cross-situational word learning described in our paper “Using speakers’ referential intentions to model early cross-situational word learning.” It is also meant to be read in conjunction with the source code for our model and the comparison models we describe in the paper. Together we hope these materials give the interested reader a...
Simultaneous Noun and Category Learning via Cross-Situational Statistics
Previous research shows that people can acquire an impressive number of word-referent pairs after viewing a series of ambiguous trials by accumulating co-occurrence statistics (e.g., Yu & Smith, 2006). The present study extends the cross-situational word learning paradigm, which has previously dealt only with noun acquisition, and shows that humans can concurrently acquire nouns and adjectives ...
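The accumulation of co-occurrence statistics described in this snippet (in the style of Yu & Smith, 2006) fits in a few lines; the pseudo-words and trial data below are invented for illustration. Each trial alone is ambiguous, but counts across trials isolate the correct pairings.

```python
# Toy illustration (assumed data) of cross-situational disambiguation:
# individually ambiguous trials, unambiguous accumulated statistics.
from collections import Counter

trials = [  # (words heard, referents in view)
    (["bosa", "gasser"], ["ball", "cup"]),
    (["bosa", "manu"],   ["ball", "dog"]),
    (["gasser", "manu"], ["cup", "dog"]),
]

counts = Counter()
for words, referents in trials:
    for w in words:
        for r in referents:
            counts[(w, r)] += 1

for w in ["bosa", "gasser", "manu"]:
    best = max((r for (w2, r) in counts if w2 == w),
               key=lambda r: counts[(w, r)])
    print(w, "->", best)  # bosa -> ball, gasser -> cup, manu -> dog
```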
Journal:
Volume 11, Issue:
Pages: -
Publication date: 2017